
    On Index Coding and Graph Homomorphism

    In this work, we study the problem of index coding from a graph homomorphism perspective. We show that the minimum broadcast rate of an index coding problem, for different variations of the problem such as non-linear, scalar, and vector index codes, can be upper bounded by the minimum broadcast rate of another index coding problem when there exists a homomorphism from the complement of the side information graph of the first problem to that of the second problem. As a result, we show that several upper bounds on the scalar and vector index coding problem are special cases of one of our main theorems. For the linear scalar index coding problem, it has been shown in [1] that the binary linear index of a graph is equal to a graph-theoretical parameter called the minrank of the graph. For undirected graphs, it is shown in [2] that $\mathrm{minrank}(G) = k$ if and only if there exists a homomorphism from $\bar{G}$ to a predefined graph $\bar{G}_k$. Combining these two results, it follows that for undirected graphs, the graphs with linear index at most $k$ coincide with the graphs $G$ for which there exists a homomorphism from $\bar{G}$ to $\bar{G}_k$. In this paper, we give a direct proof of this result that works for digraphs as well. We show how to use this classification result to generate lower bounds on the scalar and vector index. In particular, we provide a lower bound for the scalar index of a digraph in terms of the chromatic number of its complement. Using our framework, we show that changing the field size can increase the linear index of a digraph by at most a factor that is independent of the number of nodes.
    Comment: 5 pages, to appear in "IEEE Information Theory Workshop", 201
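    As a rough, self-contained illustration of the minrank parameter referenced above (not code from the paper), the sketch below brute-forces the binary minrank of a tiny side-information graph; by the result of [1], this value equals the optimal scalar linear index code length over GF(2). The function names and the 5-cycle example are our own illustrative choices, and the exhaustive search is only feasible for very small graphs.

```python
from itertools import product

def gf2_rank(rows):
    """Rank over GF(2) of a matrix whose rows are given as integer bitmasks."""
    rank = 0
    rows = list(rows)
    while rows:
        pivot = rows.pop()
        if pivot:
            rank += 1
            low = pivot & -pivot                      # lowest set bit = pivot column
            rows = [r ^ pivot if r & low else r for r in rows]
    return rank

def minrank_gf2(n, edges):
    """Minimum rank over GF(2) of matrices with ones on the diagonal and
    nonzero off-diagonal entries allowed only on edges of the digraph."""
    free = [(i, j) for i in range(n) for j in range(n) if i != j and (i, j) in edges]
    best = n
    for bits in product((0, 1), repeat=len(free)):    # exhaustive: 2^|E| candidates
        rows = [1 << i for i in range(n)]             # diagonal entries fixed to 1
        for (i, j), b in zip(free, bits):
            if b:
                rows[i] |= 1 << j                     # optional entry on an edge
        best = min(best, gf2_rank(rows))
    return best

# Example: the 5-cycle encoded as a symmetric digraph; its binary minrank is 3.
edges = {(i, (i + 1) % 5) for i in range(5)} | {((i + 1) % 5, i) for i in range(5)}
print(minrank_gf2(5, edges))                          # expected output: 3
```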

    Data Placement And Task Mapping Optimization For Big Data Workflows In The Cloud

    Data-centric workflows naturally process and analyze huge volumes of datasets. In this new era of Big Data there is a growing need to enable data-centric workflows to perform computations at a scale far exceeding a single workstation's capabilities. Therefore, this type of application can benefit from distributed high performance computing (HPC) infrastructures like cluster, grid, or cloud computing. Although data-centric workflows have been applied extensively to structure complex scientific data analysis processes, they fail to address big data challenges and to leverage the capability of dynamic resource provisioning in the Cloud. The concept of "big data workflows" is proposed by our research group as the next generation of data-centric workflow technologies to address the limitations of existing workflow technologies in meeting big data challenges. Executing big data workflows in the Cloud is a challenging problem, as workflow tasks and data must be partitioned, distributed, and assigned to the cloud execution sites (multiple virtual machines). When running such big data workflows in a cloud distributed across several physical locations, the workflow execution time and the cloud resource utilization efficiency depend heavily on the initial placement and distribution of the workflow tasks and datasets across the multiple virtual machines in the Cloud. Several workflow management systems have been developed to facilitate scientists' use of workflows; however, the data and workflow task placement issue has not yet been sufficiently addressed. In this dissertation, I propose BDAP (Big Data Placement strategy) for data placement and TPS (Task Placement Strategy) for task placement, which improve workflow performance by minimizing data movement across multiple virtual machines in the Cloud during workflow execution. In addition, I propose CATS (Cultural Algorithm Task Scheduling) for workflow scheduling, which improves workflow performance by minimizing workflow execution cost. In this dissertation, I 1) formalize the data and task placement problems in workflows, 2) propose a data placement algorithm that considers both the initial input datasets and the intermediate datasets obtained during the workflow run, 3) propose a task placement algorithm that considers placement of workflow tasks before the workflow run, 4) propose a workflow scheduling strategy to minimize the workflow execution cost once the deadline is provided by the user, and 5) perform extensive experiments in a distributed environment to validate that our proposed strategies provide an effective data and task placement solution to distribute and place big datasets and tasks into appropriate virtual machines in the Cloud within reasonable time
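    To make the placement idea above concrete, here is a minimal, hedged sketch of a greedy data-placement heuristic: each dataset is placed on the virtual machine that hosts the most tasks consuming it, subject to storage capacity, so that cross-VM data movement during workflow execution is reduced. This is an illustrative toy, not the BDAP, TPS, or CATS strategies from the dissertation; the task-to-VM assignment, dataset sizes, and VM capacities are assumed to be given.

```python
from collections import defaultdict

def place_datasets(task_vm, task_inputs, dataset_size, vm_capacity):
    """Greedy dataset-to-VM placement (illustrative only).

    task_vm:      {task: vm}              where each task will run
    task_inputs:  {task: [dataset, ...]}  datasets each task reads
    dataset_size: {dataset: size}
    vm_capacity:  {vm: free storage}      mutated as datasets are placed
    Returns {dataset: vm} for every dataset that fits somewhere.
    """
    # For every dataset, count how many tasks on each VM consume it.
    demand = defaultdict(lambda: defaultdict(int))
    for task, datasets in task_inputs.items():
        for ds in datasets:
            demand[ds][task_vm[task]] += 1

    placement = {}
    # Place larger datasets first so they are least likely to be squeezed out.
    for ds in sorted(dataset_size, key=dataset_size.get, reverse=True):
        preferred = sorted(demand[ds], key=demand[ds].get, reverse=True)
        for vm in preferred + list(vm_capacity):      # fall back to any VM with room
            if vm_capacity[vm] >= dataset_size[ds]:
                placement[ds] = vm
                vm_capacity[vm] -= dataset_size[ds]
                break
    return placement

# Toy run: two VMs, three tasks, two datasets.
tasks_on = {"t1": "vm1", "t2": "vm1", "t3": "vm2"}
reads = {"t1": ["d1"], "t2": ["d1"], "t3": ["d2"]}
print(place_datasets(tasks_on, reads, {"d1": 10, "d2": 5}, {"vm1": 20, "vm2": 20}))
# expected: {'d1': 'vm1', 'd2': 'vm2'}
```

    Intermediate datasets produced during the run could be handled the same way by re-invoking the heuristic as tasks complete, which is roughly the distinction the dissertation draws between initial input and intermediate data placement.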

    Macroprudential Policy: A Summary

    The 2007 global financial crisis brought sharply into focus the need for macroprudential policy as a means of controlling systemic financial stability. This has become a focal point for policy-makers and numerous central banks, including the Bank of Canada, but it has its drawbacks, particularly here in Canada. As a counterbalance to microprudential policy, the idea of a macroprudential outlook reaches beyond the notion that as long as every banking institution is healthy, financial stability is assured. Macroprudential policy recognizes that all those financial institutions are linked, and that stability at the individual level may translate to fragility and uncertainty at the macro level. There are two approaches to macroprudential policy, and both come with downsides. One approach examines the network factor, in which banks are linked through their interconnected financial transactions. A domino effect can thus be created; when one bank defaults, it causes a chain reaction down the line, creating instability in other banks in the network. The extent of this contagion of instability can be clearly observed through this model; unfortunately, it requires the use of detailed information typically available only to a limited circle of bank supervisors. The second approach gleans information from bank stock prices in a poorly performing market. This information is easily available and accessed, but the downside is the lack of a clear understanding of how exactly these shocks travel through the complex links of the global banking system. Canada's banking system is small and has only six major banks. However, it is important to understand how they are interconnected and how each individual bank can contribute to overall risk. Not only do banks need to be sufficiently capitalized in the normal business cycle, but it may be worthwhile for the sake of overall financial stability to create mechanisms, as regulators in some countries are doing, that require banks to hold more capital in good economic times so that they can use it as a buffer in case of a downturn. Another important macroprudential tool is to identify how much each bank contributes to systemic risk. This would entail identifying the banks that pose a greater threat to stability and having them hold extra capital. Assigning proper capital requirements is, however, not as straightforward as it may seem, as the risk of the banking system changes when capital requirements change. One study has shown that, when properly done, such a requirement can reduce the probability of a financial crisis by one-quarter. Implementing macroprudential policy in Canada faces some challenges. With both housing prices and the level of Canadians' personal debt high, sudden corrections to the financial system can create problems. Also, the interconnections between Canadian and foreign banks could result in the former being much more greatly influenced by financial-crisis spillover from the latter, something Canada generally avoided during the 2007 economic meltdown. There is no consensus as yet on the objectives of macroprudential policy. However, it is a necessary complement to microprudential policy and provides a means of managing systemic risk with the goal of greater global financial stability
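    The domino effect in the network approach described above can be made concrete with a toy default-cascade simulation. The sketch below is purely illustrative: the banks, interbank exposures, and capital buffers are invented numbers, not supervisory data, and the model ignores recovery rates, fire sales, and every other real-world complication.

```python
def default_cascade(exposures, capital, initially_failed):
    """Propagate defaults through an interbank exposure network.

    exposures[i][j]  : amount bank i is owed by bank j (written off if j defaults)
    capital[i]       : loss-absorbing capital of bank i
    initially_failed : set of bank indices that default exogenously
    Returns the set of all banks that end up in default.
    """
    failed = set(initially_failed)
    changed = True
    while changed:
        changed = False
        for i in range(len(capital)):
            if i in failed:
                continue
            # Losses of bank i from counterparties that have already defaulted.
            loss = sum(exposures[i][j] for j in failed)
            if loss >= capital[i]:
                failed.add(i)
                changed = True
    return failed

# Three hypothetical banks: bank 0 fails first, its creditor bank 1 follows,
# and bank 1's failure then pushes bank 2 over the edge -- the domino effect.
exposures = [[0, 0, 0],
             [8, 0, 0],
             [0, 6, 0]]
capital = [2, 5, 4]
print(default_cascade(exposures, capital, {0}))       # expected: {0, 1, 2}
```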

    Macroprudential Policy: A Review

    The severity and longevity of the recession caused by the 2007 financial crisis have highlighted the lack of a reliable macro-based financial regulation framework. As a consequence, addressing the link between the stability of the financial system as a whole and the performance of the overall economy has become a mandate for policymakers and scholars. Many countries have adopted macroprudential tools as policy responses for safeguarding the financial system. This paper provides a literature review of macroprudential policies, their objectives (such as financial stability), and the challenges that a macro-based framework needs to overcome, such as procyclicality and systemic risk